
First draft of explainer, for early feedback #4

Merged
merged 4 commits into immersive-web:master on Aug 5, 2019

Conversation

@kearwood (Contributor) commented Jun 5, 2019

No description provided.


"Lighting Estimation" is implemented by AR platforms using a combination of sensors, cameras, algorithms, and machine learning. Lighting estimation provides input to rendering algorithms and shaders to ensure that the shading, shadows, and reflections of objects appear natural when presented in a diverse range of settings.

The XRLightProbe and XRReflectionProbe interfaces expose the values that the platform offers to WebXR rendering engines. Their corresponding accessor functions, XRFrame.getGlobalLightEstimate() and XRFrame.getGlobalReflectionProbe(), are only accessible once the first frame of an AR session has started. The promises may be resolved on the same frame or multiple frames later, depending on the platform's capabilities. In some cases, the promises may fail, indicating that the lighting values are not available at this time and should be requested again later.

Can the page tell the difference between failures that might succeed later and failures that indicate that this platform simply doesn't support this feature for, say, reflections?

Instead of the promises failing if the lighting has not been calculated yet, could we just specify a standard set of values to return instead? That would save every app from having to create their own fallback lighting, and it would aid with consistency.
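
For illustration, per-frame usage of the proposed promise-based accessors might look like the sketch below. The interface stubs mirror the explainer's IDL, while `updateAmbientLighting` and `renderScene` are assumed application helpers, not part of any API.

```typescript
// Sketch of per-frame usage of the proposed promise-based accessor. All
// types below are local stubs mirroring the explainer, not shipping APIs.
interface XRLightProbe { indirectIrradiance: Float32Array; }
interface XRFrame {
  session: { requestAnimationFrame(cb: (t: number, f: XRFrame) => void): number };
  getGlobalLightEstimate(): Promise<XRLightProbe>;
}

declare function updateAmbientLighting(rgb: Float32Array): void; // assumed
declare function renderScene(frame: XRFrame): void;              // assumed

function onXRFrame(time: number, frame: XRFrame): void {
  frame.session.requestAnimationFrame(onXRFrame);

  // The estimate may resolve on this frame or several frames later.
  frame.getGlobalLightEstimate()
    .then((probe) => updateAmbientLighting(probe.indirectIrradiance))
    .catch(() => {
      // Not available yet: keep the previous (or fallback) lighting and
      // request the estimate again on a later frame.
    });

  renderScene(frame);
}
```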


The orientation of the lighting information is relative to the XRViewerPose for the XRFrame that getGlobalLightEstimate() or getGlobalReflectionProbe() was requested on. As it may be computationally expensive to rotate SH and texture cubes, XRLightProbe.sphericalHarmonicsCoefficients() and XRReflectionProbe.orientation() enable the same SH and texture cubes to be used in multiple orientations.
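
For example, a renderer can honor the probe orientation by rotating the shader's sampling direction rather than re-projecting the cube map texels. A minimal sketch of the host-side math (the WebGL uniform plumbing is assumed):

```typescript
// Sketch: convert the probe's orientation quaternion to a 3x3 rotation
// matrix (column-major, as WebGL expects) and upload it as a uniform. The
// fragment shader can then sample textureCube(envMap, rotation * dir)
// against the original, unrotated cube map.
type Quaternion = { x: number; y: number; z: number; w: number };

function quatToMatrix3(q: Quaternion): Float32Array {
  const { x, y, z, w } = q;
  return new Float32Array([
    1 - 2 * (y * y + z * z), 2 * (x * y + z * w),     2 * (x * z - y * w),
    2 * (x * y - z * w),     1 - 2 * (x * x + z * z), 2 * (y * z + x * w),
    2 * (x * z + y * w),     2 * (y * z - x * w),     1 - 2 * (x * x + y * y),
  ]);
}
```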

It is possible to treat a synthetic VR scene as the environment that AR content will be mixed into. In this case, the platform can report the lighting estimation using the geometry of the VR scene. As the WebXR API does not specifically express whether the world is synthetic or real, AR content is written the same way, without such knowledge. Such "AR in VR" techniques do not affect the WebXR specification directly and are beyond the scope of this text.

Is this proposing that we allow the UA to support this in VR mode? It wouldn't be a requirement, right?

@kearwood (Contributor, Author)

This would be optional for UA implementations. In this case, the session would be an immersive-ar session. This would be driven by a UA-specific UI, which requires no understanding of the "AR in VR" technique on the content's part.

### XRReflectionProbe

XRReflectionProbe should only be accessible with a permissions prompt equivalent to requesting access to the camera and microphone. XRReflectionProbe enables efficient and simple-to-implement image-based lighting. PBR shaders can index the mip map chain of the environment cube to reduce the memory bandwidth required while integrating multiple samples to match wider NDFs (normal distribution functions).

I don't understand the reference to "and microphone" here. Why not just camera?

@kearwood (Contributor, Author)

I was imagining UX similar to requesting microphone permission, not necessarily relating to the kind of sensor. It seems that this analogy is adding more confusion than help, so perhaps I should remove the "microphone" reference.
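
Returning to the quoted text's point about indexing the mip map chain: a common prefiltered-environment-map convention selects the mip level from material roughness. A sketch follows (the linear mapping is a simplification; engines often use a perceptual remapping):

```typescript
// Sketch: choose a mip of a prefiltered environment cube map from roughness.
// Mip 0 holds mirror-like filtering; the last mip is prefiltered against
// the widest NDF, so one textureLod fetch replaces many importance samples.
function mipLevelForRoughness(roughness: number, mipCount: number): number {
  const r = Math.min(Math.max(roughness, 0), 1);
  return r * (mipCount - 1);
}
```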


Global illumination describes the collective techniques used to more accurately estimate the light received from indirect reflections.

## Image based lighting

Would "Cube Map textures" be a better title for this section?

@kearwood (Contributor, Author)

"Image based lighting" implies not only the presence of cube map textures but also how they should be interpreted by a renderer. That said, the next section is labeled "Spherical Harmonics", which by itself does not imply how they should be interpreted by the renderer. At the least, we should have a glossary of these terms. Perhaps this could be discussed in the CG call to get consensus.


## Shadows

When an HDR Cube Map texture is available, shadows only have to consider occlusion of other rendered objects in the scene.

Not quite sure what this sentence is saying... I'm pretty sure this is hard.

@kearwood (Contributor, Author)

Doing this 100% correctly is very hard; however, there are many simple approximations that are commonly used. In particular, it may be sufficient in some cases to combine a baked ambient occlusion map with an IBL shader that needs no real-time dynamic lights.

A simple, non-physically-based implementation may index the HDR cube map using a surface normal and blend it with an albedo term representing the color of the surface, using operators that represent the artist's intent.
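
A host-side sketch of that simple blend (in practice this math lives in a fragment shader; `sampleCubeMap` is an assumed helper standing in for the shader's cube-map fetch):

```typescript
type Vec3 = [number, number, number];

// Assumed helper standing in for a fragment shader's cube-map fetch.
declare function sampleCubeMap(envMap: unknown, dir: Vec3): Vec3;

// Index the HDR cube map with the surface normal and blend with albedo.
// A per-channel multiply is used here; other operators may better match
// the artist's intent.
function shadeSimple(envMap: unknown, normal: Vec3, albedo: Vec3): Vec3 {
  const env = sampleCubeMap(envMap, normal); // HDR radiance along the normal
  return [env[0] * albedo[0], env[1] * albedo[1], env[2] * albedo[2]];
}
```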


### XRReflectionProbe

XRReflectionProbe should only be accessible with a permissions prompt equivalent to requesting access to the camera and microphone. XRReflectionProbe enables efficient and simple-to-implement image-based lighting. PBR shaders can index the mip map chain of the environment cube to reduce the memory bandwidth required while integrating multiple samples to match wider NDFs (normal distribution functions).

If we don't get permissions, can we still return a low-res version created from the output of XRLightProbe? That way we don't force anyone to implement spherical harmonics and directional lights if they just do IBL.

@kearwood (Contributor, Author)

I like this idea. I would like to break this out into its own issue for discussion.


Rendering algorithms take into consideration not only the light received by a surface from the light source but also light that has bounced around the scene multiple times before reaching the eye.

Traditional real-time engines have a simple global "ambient" constant value that is added to the real-time shading result. Engines using such a simple technique can use XRLightProbe.indirectIrradiance, scaled to achieve the desired effect. It may also be necessary to apply a gamma curve if the shading is done in sRGB space.

Is indirectIrradiance a scalar value? It's listed as a Float32Array below.

@kearwood (Contributor, Author)

indirectIrradiance is intended to hold 3 values in the units described in "Physically Based Units". These values would represent the red, green, and blue components of the light. It seems that we are missing text to describe these components. When scaling indirectIrradiance, one would scale each of the components.
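
Concretely, an engine with a global ambient constant might consume the three components like this (a sketch; the scale factor is content-specific):

```typescript
type RGB = [number, number, number];

// Scale each of the red, green, and blue components of indirectIrradiance
// (a Float32Array of length 3, in the units from "Physically Based Units")
// into the engine's global ambient term.
function ambientFromProbe(indirectIrradiance: Float32Array, scale: number): RGB {
  return [
    indirectIrradiance[0] * scale,
    indirectIrradiance[1] * scale,
    indirectIrradiance[2] * scale,
  ];
}

// If the engine shades in sRGB rather than linear space, apply the sRGB
// transfer function to each channel afterwards.
function linearToSRGB(c: number): number {
  return c <= 0.0031308 ? 12.92 * c : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}
```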


When an HDR Cube Map texture is available, shadows only have to consider occlusion of other rendered objects in the scene.

When an HDR Cube Map texture is not available, or the typical soft shadow effects of image based lighting are too costly to implement, XRLightProbe.primaryLightDirection and XRLightProbe.primaryLightIntensity can be used to render shadows cast by the most prominent light source.

Should primaryLightIntensity be primaryLightColor? A linear RGB triple would include intensity implicitly.
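
For example, an engine could feed those two fields straight into its existing directional shadow path (a sketch; the DirectionalLight shape is illustrative, and whether the direction points toward or away from the light is not fixed by the text above):

```typescript
type Vec3 = [number, number, number];

// Illustrative engine-side light record; not part of any proposed API.
interface DirectionalLight { direction: Vec3; color: Vec3; }

function updatePrimaryLight(
  primaryLightDirection: Vec3,
  primaryLightIntensity: Vec3, // RGB, so color is carried implicitly
  light: DirectionalLight,
): void {
  light.direction = primaryLightDirection; // also orients the shadow camera
  light.color = primaryLightIntensity;
}
```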


SH (Spherical Harmonics) are used as a more compact alternative to HDR cube maps, storing a small number of coefficient values that describe a Fourier series over the surface of a sphere. SH can effectively compress cube maps while retaining multiple lights and directionality. Due to their lightweight nature, many SH probes can be used within a scene, interpolated, or calculated for locations nearer to the lit objects.

The WebXR API supports up to 9 SH coefficients per RGB color component, for a total of 27 floating-point values. This enables detail up to SH level 2 (third order). If a platform cannot supply all 9 coefficients, it can pass 0 for the higher-order coefficients, resulting in an effectively lower-frequency reproduction.

Any reason we specify a maximum number of coefficients here? Can that be platform dependent, since in the future we might be able to estimate more of them?
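
For reference, the classic 9-coefficient irradiance evaluation (after Ramamoorthi and Hanrahan) reconstructs a color for a unit normal as sketched below. The 9-triples-of-RGB layout is an assumption, as is that the coefficients are already convolved for irradiance:

```typescript
type RGB = [number, number, number];

// Sketch: evaluate level-2 (9-coefficient) SH per color channel at a unit
// normal (nx, ny, nz). `coeffs` holds 27 floats as 9 RGB triples.
function evalSH9(coeffs: Float32Array, nx: number, ny: number, nz: number): RGB {
  // Real SH basis functions for bands l = 0, 1, 2.
  const basis = [
    0.282095,                       // Y(0, 0)
    0.488603 * ny,                  // Y(1,-1)
    0.488603 * nz,                  // Y(1, 0)
    0.488603 * nx,                  // Y(1, 1)
    1.092548 * nx * ny,             // Y(2,-2)
    1.092548 * ny * nz,             // Y(2,-1)
    0.315392 * (3 * nz * nz - 1),   // Y(2, 0)
    1.092548 * nx * nz,             // Y(2, 1)
    0.546274 * (nx * nx - ny * ny), // Y(2, 2)
  ];
  const out: RGB = [0, 0, 0];
  for (let i = 0; i < 9; i++) {
    out[0] += coeffs[3 * i + 0] * basis[i];
    out[1] += coeffs[3 * i + 1] * basis[i];
    out[2] += coeffs[3 * i + 2] * basis[i];
  }
  return out;
}
// Zeroed higher-order coefficients simply contribute nothing, which is why
// platforms with fewer bands can pad with 0.
```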


"Lighting Estimation" is implemented by AR platforms using a combination of sensors, cameras, algorithms, and machine learning. Lighting estimation provides input to rendering algorithms and shaders to ensure that the shading, shadows, and reflections of objects appear natural when presented in a diverse range of settings.

The XRLightProbe and XRReflectionProbe interfaces expose the values that the platform offer to WebXR rendering engines. Their corresponding accessor functions, XRFrame.getGlobalLightEstimate() and XRFrame.getGlobalReflectionProbe() are only accessible once the first frame of an AR session has started. The promises may be resolved on the same frame or multiple frames later, depending on the platform capabilities. In some cases, the promises may fail, indicating that the lighting values are not available at this time and should be requested again at a later time.
Copy link

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Instead of the promises failing if the lighting has not been calculated yet, could we just specify a standard set of values to return instead? That would save every app from having to create their own fallback lighting, and it would aid with consistency.

```webidl
partial interface XRFrame {
  Promise<XRLightProbe> getGlobalLightEstimate();
  Promise<XRReflectionProbe> getGlobalReflectionProbe();
};
```

Do we want this API to be promise-based? If the lighting is estimated for a specific frame and we're returning promises, then the application would not be able to get the result of getGlobalLightEstimate() / getGlobalReflectionProbe() during the requestAnimationFrame callback relevant to the frame for which the lighting was estimated. It would be able to act on the data (at the earliest) in a subsequent frame's rAF, and by that time the estimate might be outdated.

One alternative would be to make this API subscription-based (similar to what is described in the existing hit test explainer).
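
Purely as a strawman, a subscription-style shape might let the page read the estimate synchronously inside the relevant rAF; every name below is hypothetical and not part of this proposal:

```typescript
// Hypothetical subscription-style alternative (names invented for
// illustration only, loosely modeled on the hit-test explainer).
interface XRLightEstimate { indirectIrradiance: Float32Array; }
interface XRLightProbeSubscription { /* opaque handle */ }
interface XRSessionWithLighting {
  requestLightProbeSubscription(): Promise<XRLightProbeSubscription>;
}
interface XRFrameWithLighting {
  // Synchronous: returns this frame's estimate, or null if none yet.
  getLightEstimate(sub: XRLightProbeSubscription): XRLightEstimate | null;
}
```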

@kearwood merged commit 71d4b76 into immersive-web:master on Aug 5, 2019